
    Quantify resilience enhancement of UTS through exploiting connected community and internet of everything emerging technologies

    This work aims at investigating and quantifying the resilience enhancement of the Urban Transport System (UTS) enabled by the adoption of emerging technologies such as the Internet of Everything (IoE) and the new trend of the Connected Community (CC). A conceptual extension of the Functional Resonance Analysis Method (FRAM) and its formalization have been proposed and used to model UTS complexity. The scope is to identify the system functions and their interdependencies, with a particular focus on those that relate to and impact people and communities. Network analysis techniques have been applied to the FRAM model to identify and rank the most critical community-related functions. The notion of Variability Rate (VR) has been defined as the amount of output variability generated by an upstream function that can be tolerated/absorbed by a downstream function without significantly increasing its own subsequent output variability. A fuzzy-based quantification of the VR from expert judgment has been developed for cases where quantitative data are not available. Our approach has been applied to a critical scenario (cloudburst/flash flooding) in two cases: with and without CC and IoE implemented. The results show a remarkable VR enhancement when CC and IoE are deployed.
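    As a concrete illustration of the fuzzy, expert-judgment-based quantification sketched above, the following snippet aggregates triangular fuzzy ratings from several experts into a single crisp VR value via centroid defuzzification. The rating values and the averaging scheme are illustrative assumptions, not the paper's actual formalization:

```python
# Illustrative sketch (not the paper's implementation): quantifying a
# Variability Rate (VR) from expert judgment with triangular fuzzy numbers.

def triangular_centroid(a, b, c):
    """Centroid (defuzzified value) of a triangular fuzzy number (a, b, c)."""
    return (a + b + c) / 3.0

def aggregate_expert_vr(judgments):
    """Average the triangular fuzzy judgments of several experts, then
    defuzzify to a single crisp Variability Rate in [0, 1]."""
    n = len(judgments)
    a = sum(j[0] for j in judgments) / n
    b = sum(j[1] for j in judgments) / n
    c = sum(j[2] for j in judgments) / n
    return triangular_centroid(a, b, c)

# Three hypothetical experts rate how much upstream output variability a
# downstream function can absorb, as (pessimistic, most likely, optimistic).
experts = [(0.2, 0.4, 0.6), (0.3, 0.5, 0.7), (0.1, 0.4, 0.5)]
vr = aggregate_expert_vr(experts)
```

A real application would use calibrated membership functions per linguistic label ("low", "medium", "high") rather than raw expert triples.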

    Understanding the structures of Pier Luigi Nervi: a multidisciplinary approach

    The paper describes the strategies adopted to carry out the knowledge campaign on Hall C, built by Pier Luigi Nervi at Torino Esposizioni between 1949 and 1950 and belonging to the architectural heritage of the 20th century. The structure was built by combining reinforced concrete and ferrocement elements, thus implementing what would later become the distinctive construction system of Nervi's work, which combined the use of precast in situ and cast-in-place elements. The extensive review of the historical documentation allowed the identification of the distinctive features and material differences of all structural elements, in order to formulate the least invasive testing campaign possible, combining sample extraction with non-destructive testing. The paper aims to illustrate the problems and challenges associated with the creation of interpretive models of the built heritage through the relationship between historical-critical investigations and structural diagnosis, and is intended to serve as an example of an appropriate investigation phase aimed at developing guidelines for the conservation of a complex and iconic work. The present work is supported by the Keeping it Modern grant awarded by The Getty Foundation of Los Angeles (USA). Lenticchia, E.; Ceravolo, R.; Faccio, P. (2023). Understanding the structures of Pier Luigi Nervi: a multidisciplinary approach. VITRUVIO - International Journal of Architectural Technology and Sustainability. 8:66-75. https://doi.org/10.4995/vitruvio-ijats.2023.18862

    Preface


    CoSMo: a Framework for Implementing Conditioned Process Simulation Models

    Process simulation is an analysis tool in process mining that allows users to measure the impact of changes, prevent losses, and update a process without risks or costs. Several process simulation techniques are available in the literature; they are usually built either upon process models discovered from a given event log or learned via deep learning. Each group of approaches has its own strengths and limitations. The former is usually restricted to the control-flow perspective but is more interpretable, whereas the latter is not interpretable by nature but generalizes better on large event logs. Despite the great performance achieved by deep learning approaches, they are still not suitable for application to real scenarios where they could generate value for users, mainly because their stochasticity is hard to control. To address this problem, we propose the CoSMo framework for implementing process simulation models fully based on deep learning. The framework enables simulating event logs that satisfy a constraint by conditioning the learning phase of a deep neural network. In our experiments, the simulation is validated from both the control-flow and data-flow perspectives, demonstrating the framework's capability of simulating cases while satisfying imposed conditions.
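    The core idea of conditioning a simulation on a constraint can be sketched without a neural network: below, a lookup table of per-condition transition probabilities stands in for CoSMo's learned, conditioned model, and a sampler generates traces whose control-flow satisfies the chosen condition. Condition names and activities are hypothetical:

```python
import random

# Simplified illustration of condition-aware simulation. The actual CoSMo
# framework conditions a deep neural network during training; here a table
# of transition probabilities per condition stands in for the learned model.
TRANSITIONS = {
    # condition "fast": the manual review step never occurs
    "fast":  {"start": [("register", 1.0)],
              "register": [("approve", 0.7), ("end", 0.3)],
              "approve": [("end", 1.0)]},
    # condition "audit": a review activity is always included
    "audit": {"start": [("register", 1.0)],
              "register": [("review", 1.0)],
              "review": [("approve", 1.0)],
              "approve": [("end", 1.0)]},
}

def simulate_case(condition, rng=random.random):
    """Sample one trace whose control-flow satisfies the given condition."""
    trace, current = [], "start"
    while current != "end":
        r, acc = rng(), 0.0
        for activity, p in TRANSITIONS[condition][current]:
            acc += p
            if r <= acc:
                current = activity
                break
        trace.append(current)
    return trace
```

Replacing the table with a conditioned sequence model while keeping the same sampling loop is, roughly, the step the framework automates.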

    Ontology based recommender system using social network data

    Online Social Networks (OSNs) are considered a key source of information for real-time decision making. However, several constraints reduce the amount of information a researcher can obtain while increasing the time required for social network mining. In this context, this paper proposes a new framework for sampling OSNs. Domain knowledge is used to define tailored strategies that can decrease the budget and time required for mining while increasing recall. An ontology supports our filtering layer in evaluating the relatedness of nodes. Our approach demonstrates that the same mechanism can be extended to provide recommendations to users. Our test cases and experimental results emphasize the importance of the strategy-definition step in our social miner and of applying ontologies to the knowledge graph in the domain of recommendation analysis.
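    The filtering idea can be illustrated with a toy crawl: a breadth-first sampler only expands nodes whose declared interest is related, through a small concept hierarchy, to the target domain, so the budget is spent on relevant nodes. The graph, interests, and ontology below are invented for illustration and are not the paper's data or system:

```python
from collections import deque

# Toy concept hierarchy: concept -> parent (hypothetical, for illustration).
ONTOLOGY = {
    "jazz": "music", "rock": "music", "music": "culture",
    "football": "sport", "sport": "culture",
}

def related(concept, domain):
    """True if `concept` is `domain` or one of its ontology ancestors is."""
    while concept is not None:
        if concept == domain:
            return True
        concept = ONTOLOGY.get(concept)
    return False

def sample(graph, interests, seed, domain, budget):
    """Breadth-first crawl that only expands domain-related nodes."""
    visited, queue, seen = [], deque([seed]), {seed}
    while queue and len(visited) < budget:
        node = queue.popleft()
        if not related(interests.get(node), domain):
            continue  # filtered out by the ontology: budget is not spent here
        visited.append(node)
        for nb in graph.get(node, []):
            if nb not in seen:
                seen.add(nb)
                queue.append(nb)
    return visited

graph = {"a": ["b", "c"], "b": ["d"], "c": [], "d": []}
interests = {"a": "jazz", "b": "football", "c": "rock", "d": "music"}
sampled = sample(graph, interests, "a", "music", budget=10)
```

Note that filtering node "b" also prunes its neighborhood, which is how domain knowledge reduces the crawl budget.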

    Tailoring Machine Learning for Process Mining

    Machine learning models are routinely integrated into process mining pipelines to carry out tasks like data transformation, noise reduction, anomaly detection, classification, and prediction. Often, the design of such models rests on ad-hoc assumptions about the corresponding data distributions that are not necessarily in accordance with the non-parametric distributions typically observed in process data. Moreover, the learning procedure they follow ignores the constraints that concurrency imposes on process data. Data encoding is a key element for smoothing the mismatch between these assumptions and the data, but its potential is poorly exploited. In this paper, we argue that a deeper insight into the issues raised by training machine learning models with process data is crucial to ground a sound integration of process mining and machine learning. Our analysis of these issues lays the foundation for a methodology that correctly aligns machine learning with process mining requirements, and aims to stimulate research in this direction.
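    One encoding issue the abstract alludes to can be made concrete: to feed variable-length traces to a standard tabular model, they must be mapped to fixed-length vectors, and simple encodings discard exactly the ordering and concurrency information that process data carries. The activity names below are hypothetical:

```python
# Illustrative sketch: frequency encoding of event-log traces. Turning a
# variable-length trace into a fixed-length vector lets a standard ML model
# consume it, but discards ordering/concurrency information -- the kind of
# mismatch between process data and tabular ML assumptions discussed above.

def frequency_encode(trace, vocabulary):
    """Encode a trace as the count of each activity in a fixed vocabulary."""
    return [trace.count(a) for a in vocabulary]

vocab = ["register", "review", "approve", "reject"]
v1 = frequency_encode(["register", "review", "approve"], vocab)
v2 = frequency_encode(["register", "review", "review", "reject"], vocab)
```

Two traces with the same counts but different orderings map to identical vectors, which is precisely why encoding choices deserve more attention than they usually get.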

    Comparing Concept Drift Detection with Process Mining Software

    Organisations have seen a rise in the volume of recorded business process data. Handling process data is a meaningful way to extract relevant information from business processes, with an impact on the company's value. Nonetheless, business processes are subject to changes during their execution, which adds complexity to their analysis. This paper evaluates currently available process mining tools and software that handle concept drift, i.e. changes over time in the statistical properties of the events occurring in a process. We provide an in-depth analysis of these tools, comparing their differences, advantages, and disadvantages by testing them against a log taken from a Process Control System. By highlighting the trade-offs between the tools, the paper gives stakeholders the best options for their use case.
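    The notion of concept drift being compared can be illustrated in a few lines: estimate the activity distribution in two windows of the event stream and flag drift when the distributions diverge. This is a minimal sketch of the general idea, not the detection algorithm of any specific tool in the comparison:

```python
from collections import Counter

# Minimal illustration of concept drift in an event stream: the statistical
# properties of events change over time. Drift is flagged when activity
# frequencies differ sharply between two windows of the log.

def distribution(window):
    """Relative frequency of each activity in a window of events."""
    total = len(window)
    return {a: c / total for a, c in Counter(window).items()}

def drift_score(window_a, window_b):
    """Total variation distance between activity distributions, in [0, 1]."""
    pa, pb = distribution(window_a), distribution(window_b)
    activities = set(pa) | set(pb)
    return 0.5 * sum(abs(pa.get(a, 0) - pb.get(a, 0)) for a in activities)

before = ["a", "b", "a", "b"]   # stable behaviour
after  = ["a", "c", "c", "c"]   # activity c replaces b: drift
score = drift_score(before, after)
```

Real detectors differ mainly in the statistic used, the windowing strategy, and the significance test applied to the score.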

    CHAMALEON: Framework to improve Data Wrangling with Complex Data

    Data transformation and schema conciliation are relevant topics in industry due to the incorporation of data-intensive business processes in organizations. As the number of data sources increases, the complexity of the data increases as well, leading to complex and nested data schemata. Nowadays, novel approaches are being employed in academia and industry to assist non-expert users in transforming, integrating, and improving the quality of datasets (i.e., data wrangling). However, there is a lack of support for transforming semi-structured complex data. This article surveys the state of the art, identifying and analyzing the most relevant solutions from academia and industry for transforming this type of data. In addition, we propose a Domain-Specific Language (DSL) to support the transformation of complex data as a first approach to enhancing data wrangling processes. We also develop a framework that implements the DSL and evaluate it in a real-world case study.
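    A typical primitive such a DSL must express is flattening nested records into a tabular schema. The sketch below shows that operation in plain Python; the function and field names are illustrative and are not the CHAMALEON DSL itself:

```python
# Hedged sketch of one transformation a data-wrangling DSL for complex
# (nested) data must support: flattening nested dictionaries into
# dot-separated column names, a common first step in schema conciliation.

def flatten(record, prefix=""):
    """Recursively flatten nested dicts into a single-level dict whose keys
    are dot-separated paths, e.g. {"a": {"b": 1}} -> {"a.b": 1}."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

row = flatten({"order": {"id": 7, "customer": {"name": "Ada"}}, "total": 3.5})
```

A DSL adds value over such ad-hoc code by also handling arrays, type coercion, and declarative mappings between source and target schemata.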

    Toward Sensor-Based Context Aware Systems

    This paper proposes a methodology for sensor data interpretation that combines sensor outputs with contexts represented as sets of annotated business rules. Sensor readings are interpreted to generate events labeled with the appropriate type and level of uncertainty. Then, the appropriate context is selected. Reconciliation of different uncertainty types is achieved by a simple technique that moves uncertainty from events to business rules by generating combs of standard Boolean predicates. Finally, context rules are evaluated together with the events to reach a decision. The feasibility of our idea is demonstrated via a case study in which a context-reasoning engine was connected to simulated heartbeat sensors using prerecorded experimental data. We use sensor outputs to identify the proper operating context of a system and to trigger decision making based on context information.
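    The pipeline described above can be sketched end to end: a raw reading becomes a typed event with an uncertainty label, and annotated rules fire only when both their predicate holds and the event's certainty meets the rule's annotation. Thresholds, rule names, and actions below are hypothetical, not the paper's case-study values:

```python
# Illustrative sketch of the methodology: uncertain sensor readings are
# turned into typed events, and annotated business rules are evaluated
# against those events to reach a decision.

def classify_heartbeat(bpm, confidence):
    """Map a raw reading to a typed event with its uncertainty level.
    The 0.8 confidence and 100 bpm thresholds are illustrative."""
    level = "high" if confidence >= 0.8 else "low"
    kind = "tachycardia" if bpm > 100 else "normal"
    return {"type": kind, "uncertainty": level}

# Context rules: (action, Boolean predicate over the event, minimum
# certainty annotation required for the rule to fire).
RULES = [
    ("alert_medic", lambda e: e["type"] == "tachycardia", "high"),
    ("log_only",    lambda e: e["type"] == "tachycardia", "low"),
]

def decide(event):
    """Fire every rule whose predicate holds at the event's certainty."""
    rank = {"low": 0, "high": 1}
    return [name for name, pred, need in RULES
            if pred(event) and rank[event["uncertainty"]] >= rank[need]]

event = classify_heartbeat(bpm=120, confidence=0.9)
actions = decide(event)
```

Moving the uncertainty check into the rule annotation, rather than the event itself, is the essence of the predicate-comb technique the abstract mentions.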